Gaussian Process Convolutional Dictionary Learning

Authors

Abstract

Convolutional dictionary learning (CDL), the problem of estimating shift-invariant templates from data, is typically conducted in the absence of a prior or structure on the templates. In data-scarce or low signal-to-noise ratio (SNR) regimes, the learned templates overfit the data and lack smoothness, which can affect predictive performance in downstream tasks. To address this limitation, we propose GPCDL, a convolutional dictionary learning framework that enforces priors on templates using Gaussian Processes (GPs). With a focus on smoothness, we show theoretically that imposing a GP prior is equivalent to Wiener filtering the learned templates, thereby suppressing high-frequency components and promoting smoothness. We show that the resulting algorithm is a simple extension of the classical iteratively reweighted least squares algorithm, independent of the choice of GP kernels. This property allows one to experiment flexibly with different smoothness assumptions. Through simulation, we show that GPCDL learns smooth dictionaries with better accuracy than the unregularized alternative across a range of SNRs. In an application to neural spiking data, GPCDL learns a more accurate and visually-interpretable dictionary, leading to superior predictive performance compared to non-regularized CDL, as well as parametric alternatives.
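The abstract's central identity (a zero-mean GP prior with Gaussian noise yields a Wiener-filter-like smoothing of the template) can be illustrated with the standard MAP estimate d_hat = K (K + sigma^2 I)^{-1} d, where K is the kernel's Gram matrix. The sketch below is an illustrative reconstruction of that general idea, not the paper's actual GPCDL algorithm; the kernel choice, length scale, and test signal are all assumptions for demonstration.

```python
import numpy as np


def rbf_kernel(n, length_scale):
    """Gram matrix of a squared-exponential (RBF) kernel on sample indices 0..n-1."""
    idx = np.arange(n)
    diff = idx[:, None] - idx[None, :]
    return np.exp(-0.5 * (diff / length_scale) ** 2)


def gp_smooth(template, length_scale=3.0, noise_var=0.04):
    # MAP estimate under a zero-mean GP prior with i.i.d. Gaussian noise:
    #   d_hat = K (K + sigma^2 I)^{-1} d
    # which acts as a Wiener-type filter suppressing high-frequency components.
    n = len(template)
    K = rbf_kernel(n, length_scale)
    return K @ np.linalg.solve(K + noise_var * np.eye(n), template)


# Hypothetical noisy spike-like template for illustration.
rng = np.random.default_rng(0)
t = np.linspace(-3, 3, 61)
clean = np.exp(-t ** 2)
noisy = clean + 0.2 * rng.standard_normal(t.size)
smoothed = gp_smooth(noisy)
```

Varying `length_scale` corresponds to the "different smoothness assumptions" the abstract mentions: larger values suppress more of the spectrum, giving smoother (but potentially oversmoothed) templates.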


Similar Articles

Convolutional Dictionary Learning

Convolutional sparse representations are a form of sparse representation with a dictionary that has a structure that is equivalent to convolution with a set of linear filters. While effective algorithms have recently been developed for the convolutional sparse coding problem, the corresponding dictionary learning problem is substantially more challenging. Furthermore, although a number of diffe...


Learning Dependent Dictionary Representation with Efficient Multiplicative Gaussian Process

In dictionary learning for analysis of images, spatial correlation from extracted patches can be leveraged to improve characterization power. We propose a Bayesian framework for dictionary learning, where spatial location dependencies are captured by imposing a multiplicative Gaussian process prior on the latent units representing binary activations. Data augmentation and Kronecker methods allo...


Convolutional Dictionary Learning through Tensor Factorization

Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models such as topic models, independent component analysis and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher order input moments. However, in many domains, additional invariances such as shift invariances exist, enforced via models such as convolu...


First and Second Order Methods for Online Convolutional Dictionary Learning

Convolutional sparse representations are a form of sparse representation with a structured, translation invariant dictionary. Most convolutional dictionary learning algorithms to date operate in batch mode, requiring simultaneous access to all training images during the learning process, which results in very high memory usage, and severely limits the training data that can be used. Very recent...


Multi-Layer Convolutional Sparse Modeling: Pursuit and Dictionary Learning

The recently proposed Multi-Layer Convolutional Sparse Coding (ML-CSC) model, consisting of a cascade of convolutional sparse layers, provides a new interpretation of Convolutional Neural Networks (CNNs). Under this framework, the computation of the forward pass in a CNN is equivalent to a pursuit algorithm aiming to estimate the nested sparse representation vectors – or feature maps – from a g...



Journal

Title: IEEE Signal Processing Letters

Year: 2021

ISSN: 1558-2361, 1070-9908

DOI: https://doi.org/10.1109/lsp.2021.3127471